Convergence of the conjugate gradient method with unbounded operators

Authors

Abstract

In the framework of inverse linear problems on infinite-dimensional Hilbert space, we prove the convergence of the conjugate gradient iterates to an exact solution of the inverse problem in the most general case where the self-adjoint, non-negative operator is unbounded, and under minimal, technically unavoidable assumptions on the initial guess of the iterative algorithm. The convergence is proved to hold always in the Hilbert space norm (error convergence), as well as at other levels of regularity (energy norm, residual, etc.) depending on the regularity of the iterates. We also discuss, both analytically and through a selection of numerical tests, the main features and differences of our result as compared to the case, already available in the literature, where the operator is bounded.
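The paper treats the infinite-dimensional, unbounded setting; as a point of comparison, the finite-dimensional mechanics it generalizes can be sketched with plain conjugate gradient on a stiff symmetric positive definite matrix whose eigenvalues k² grow like the spectrum of an unbounded operator. This is an illustrative sketch only, not code from the paper; the function name, test matrix, and tolerances are chosen here for demonstration.

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-9, max_iter=500):
    """Plain conjugate gradient for a symmetric positive definite matrix A."""
    x = x0.astype(float).copy()
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# A stiff SPD matrix mimicking an unbounded operator: eigenvalues k^2 grow without bound
n = 20
A = np.diag(np.arange(1, n + 1, dtype=float) ** 2)
x_true = np.ones(n)
b = A @ x_true
x = conjugate_gradient(A, b, np.zeros(n))
```

In exact arithmetic CG on this matrix terminates in at most n steps, one per distinct eigenvalue; the paper's concern is precisely what survives of such convergence when the spectrum is unbounded.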


Similar resources

New results on the convergence of the conjugate gradient method

This paper is concerned with proving theoretical results related to the convergence of the Conjugate Gradient method for solving positive definite symmetric linear systems. New relations for ratios of the A-norm of the error and the norm of the residual are provided starting from some earlier results of Sadok [13]. These results use the well-known correspondence between the Conjugate Gradient m...
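The relation mentioned above rests on a standard identity: since the error e = A⁻¹r, the A-norm of the error satisfies ‖e‖²_A = rᵀA⁻¹r, and CG minimizes ‖e‖_A over growing Krylov subspaces, so it decreases monotonically even when the residual norm does not. A small numerical check of this behavior (the random test matrix and iteration count are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 30
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)     # symmetric positive definite
x_true = rng.standard_normal(n)
b = A @ x_true

x = np.zeros(n)
r = b.copy()
p = r.copy()
err_A = []   # A-norm of the error, ||x_true - x||_A
res = []     # Euclidean norm of the residual
for _ in range(15):
    e = x_true - x
    err_A.append(np.sqrt(e @ (A @ e)))
    res.append(np.linalg.norm(r))
    Ap = A @ p
    alpha = (r @ r) / (p @ Ap)
    x += alpha * p
    r_new = r - alpha * Ap
    p = r_new + ((r_new @ r_new) / (r @ r)) * p
    r = r_new

# CG's optimality property: the A-norm of the error decreases at every step
mono = all(err_A[i + 1] < err_A[i] for i in range(len(err_A) - 1))
```

Tracking both sequences side by side is exactly the kind of comparison the ratio results described above quantify.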


Global convergence of a modified spectral FR conjugate gradient method

A modified spectral PRP conjugate gradient method is presented for solving unconstrained optimization problems. The constructed search direction is proved to be a sufficient descent direction of the objective function. With an Armijo-type line search to determine the step length, a new spectral PRP conjugate gradient algorithm is developed. Under some mild conditions, the theory of global convergenc...
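The Armijo-type line search mentioned above is a backtracking rule: starting from a trial step, shrink it until the objective decreases by at least a fixed fraction of the first-order prediction. A generic sketch of that ingredient, not the paper's specific algorithm (function names and parameter values are illustrative):

```python
import numpy as np

def armijo_step(f, grad_f, x, d, sigma=1e-4, beta=0.5, t0=1.0, max_backtracks=50):
    """Backtracking Armijo line search: return t with
    f(x + t*d) <= f(x) + sigma * t * grad_f(x) @ d,
    assuming d is a descent direction (grad_f(x) @ d < 0)."""
    g_d = grad_f(x) @ d
    assert g_d < 0, "d must be a descent direction"
    t = t0
    fx = f(x)
    for _ in range(max_backtracks):
        if f(x + t * d) <= fx + sigma * t * g_d:
            return t
        t *= beta          # shrink the step and try again
    return t

# Example on the quadratic f(x) = 0.5 ||x||^2, with the steepest descent direction
f = lambda x: 0.5 * x @ x
grad = lambda x: x
x0 = np.array([3.0, -4.0])
d = -grad(x0)
t = armijo_step(f, grad, x0, d)
```

The sufficient-descent property proved for the modified search direction is precisely what guarantees that such a line search always terminates with a productive step.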


Global Convergence of a Modified Liu-storey Conjugate Gradient Method

In this paper, we make a modification to the LS conjugate gradient method and propose a descent LS method. The method generates a sufficient descent direction for the objective function. We prove that the method is globally convergent with an Armijo-type line search. Moreover, under mild conditions, we show that the method is globally convergent if the Armijo line search or the Wolfe line sea...


Deterministic convergence of conjugate gradient method for feedforward neural networks

Conjugate gradient methods have many advantages in real numerical experiments, such as fast convergence and low memory requirements. This paper considers a class of conjugate gradient learning methods for backpropagation (BP) neural networks with three layers. We propose a new learning algorithm for almost cyclic BP neural networks based on the PRP conjugate gradient method. We then establish the d...



Journal

Journal title: Operators and Matrices

Year: 2022

ISSN: 1848-9974, 1846-3886

DOI: https://doi.org/10.7153/oam-2022-16-05